Patent Abstract:
SYSTEMS AND METHODS FOR CAPTURING LARGE AREA IMAGES IN DETAIL INCLUDING CASCADED CAMERAS AND/OR CALIBRATION FEATURES. The present invention relates to a method and system in which images are captured from overview and detail image generation devices, so that overview images are captured with a first degree of redundancy, and detail images are captured with less overlap and with a second degree of redundancy.
Publication number: BR112012006577A2
Application number: R112012006577-7
Filing date: 2010-09-22
Publication date: 2020-10-13
Inventor: Stuart William Nixon
Applicant: Nearmap Pty Ltd
IPC main classification:
Patent Description:

Descriptive Report of the Invention Patent for "SYSTEMS AND METHODS FOR CAPTURING LARGE AREA IMAGES IN DETAIL INCLUDING CASCADED CAMERAS AND/OR CALIBRATION FEATURES".
Cross-Reference to Related Application
This application is a continuation-in-part of US Patent Application 12/101,167, filed on April 11, 2008, and entitled Systems and Methods of Capturing Large Area Images in Detail Including Cascaded Cameras and/or Calibration Features, the description of which is incorporated in this document by reference in its entirety.
Copyright Notice and Authorization
Portions of the documentation in this patent document contain material that is subject to copyright protection. The copyright owner has no objection to facsimile reproduction by anyone of the patent document or the patent description as it appears in the Patent and Trademark Office file or records, but otherwise reserves all copyrights whatsoever.
Brief Description of the Drawings
In the drawings:
figure 1 is a block diagram of an illustrative system for capturing overview and detail images;
figures 2A and 2B are block diagrams of other illustrative systems for capturing overview and detail images;
figure 3 is a block diagram of another illustrative system for capturing overview and detail images;
figure 4 is a diagram representing a camera compartment system;
figure 5A illustrates an illustrative implementation including an external compartment mounted on a small single-engine aircraft;
figure 5B illustrates an illustrative implementation of image capture subsystems mounted inside an external compartment;
figure 5C illustrates the illustrative use of an aircraft to collect overview and detail image data;
figure 5D illustrates a flight plan for collecting overview and detail images;
figures 6A and 6B are diagrams illustrating illustrative representations of overview and detail images;
figures 7A and 7B are diagrams illustrating additional illustrative representations of overview and detail images;
figures 8A through 8C are tables illustrating representative camera configurations for two embodiments of illustrative systems for capturing overview and detail images;
figure 9 illustrates an aircraft equipped with a computing/processing system; and
figure 10 illustrates a block diagram of a notebook/laptop computer working in conjunction with a controller and GPS system as described in one embodiment.
Detailed Description
Certain terminology is used in this document for convenience only and is not to be construed as limiting the embodiments of this description.
In the drawings, the same letters and reference numbers are used to designate the same elements throughout the various figures.
The words "right", "left", "lower", "upper" designate directions in the drawings to which references are made.
The words "forward", "sideways" refer to the directions of travel of a vehicle, aircraft, spaceship, submarine or other platform that is translated referring to the terrain.
The terminology includes the words specifically mentioned above, derivations thereof and words with a similar meaning.
The term "resolution" when used in this document referring to an image: refers to the ability to distinguish portrayed objects, with resolution typically being provided in cm and with reference to the object (s) on the ground.
When used in this context, resolution may alternatively be called ground resolution element, resolution cell, ground resolution, or ground pixel resolution. When used referring to a camera or other imaging device, resolution may refer to the pixel density of that imaging device. As will be understood by those skilled in the art, image resolution (ground resolution element, resolution cell, ground resolution, or ground pixel resolution) depends on several parameters, including not only the camera resolution but also other variables including, without limitation, the image generation system (e.g., lens) and the operational conditions (e.g., altitude) under which images are captured.
Aerial and satellite images of the earth are used for a wide variety of military, commercial and consumer applications. A number of emerging applications include serving photomaps on the Internet, and services based on the generation of these photomaps (for example, maps and directions, real estate values). In general, there is an increasing demand for photomaps, and for recently updated photomaps. However, existing systems for the generation of photomaps often involve excessively complex components, require high capital expenditures, and/or have high operating costs, among other disadvantages. They are not able to produce images within short periods of time and operating regimes, or otherwise provide the high resolution currently desired.
In general, existing photogrammetric imaging solutions fail to meet the growing demand for more timely and higher resolution images due to their inability to capture sufficient amounts of appropriate high resolution data in an efficient manner. In accordance with principles consistent with certain aspects related to the innovations in this document, camera systems used for aerial photogrammetry must address two conflicting requirements.
First, it is vital that the camera system's lens and focal system parameters (known as interior orientation), as well as its position in space and viewing angle (known as exterior orientation), are precisely calculated. A known photogrammetric solution, bundle adjustment, can be used to calculate the interior and exterior orientation information for the camera and for each photograph taken by the camera. Such calculations are often a prerequisite for allowing individual photographs to be merged into continuous photomaps. One way to achieve the required level of accuracy is to obtain multiple images, with a large amount of redundant data between the photographs. Common features, common elements, common points, or image elements visible in various photographs can then be identified and used to calculate the camera's interior and exterior parameters. However, even with large amounts of redundant data between photographs, it can be difficult to identify common points or image elements if the photographs were taken at different times or under different conditions (for example, different altitudes, different times of the day), since the common points or elements of the image may have moved or may have differences in appearance (for example, different shading due to changes in lighting) that make it difficult to correlate between these points or common image elements.
Second, it is desirable that the survey be completed quickly. This provides several advantages, such as reduced operating costs and minimized delays resulting from unfavorable environmental conditions or topography, such as a harsh climate. An effective way to increase the amount of land area captured, measured in km² per hour, is to minimize the amount of redundancy between the high resolution detail photographs that are subsequently used to generate the photomaps.
Thus, the desire to increase redundancy between images, to allow accurate photogrammetric positioning of images, must be balanced against the desire to decrease redundancy between photographs, to complete surveys at a lower cost.
The collection of aerial photomap data can be performed by flying an aircraft equipped with aerial imaging devices (for example, cameras) along a flight plan that involves flying along a relatively straight path, banking and turning the aircraft 180º to fly a parallel return path with some lateral displacement from the original path, and repeating this pattern until a designated area of the terrain has been photographed. As will be understood by those skilled in the art, images or photographs are captured at periodic intervals along the straight part of the flight plan to create photographs with forward overlap, and the flight plan is designed so that the captured images also have side-to-side overlap.
Overlap between images can be created by a number of mechanisms. For example, an imaging system that is being moved along a geometric axis, or generally moved above the ground in a vehicle (for example, an aircraft), can capture images periodically. The time between captured images (photographs) can be arranged so that the photographs overlap in the direction of travel. The overlap resulting from forward travel along the course is usually referred to as forward overlap. Photographs that are taken one after another in such a system and that have the forward overlap mentioned above can be referred to as sequential or adjacent photographs. In flight plans with a forward path and a return path, lateral overlap is created by spacing the forward path and the return path so that the images captured along these paths have a desired degree of overlap. The overlap resulting from the spacing of the forward and return paths of the flight plan is usually referred to as lateral overlap. Finally, imaging systems or cameras can be arranged within an image capture system so that they point to different areas of the terrain below, with the overlap between captured images being created by the mechanical arrangement of the image capture systems (for example, cameras).
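As an informal illustration of the first two mechanisms (not part of the original specification, and using assumed example values for footprint and speed), the capture interval that yields a given forward overlap and the flight-line spacing that yields a given lateral overlap follow directly from the image footprint on the ground:

```python
# A minimal sketch, under assumed example values, of how forward overlap
# fixes the capture interval and lateral overlap fixes flight-line spacing.

footprint_along = 1000.0    # assumed image footprint in flight direction (m)
footprint_across = 1500.0   # assumed image footprint across flight direction (m)
ground_speed = 250.0 / 3.6  # assumed ground speed: 250 km/h in m/s

forward_overlap = 0.80      # 80% forward overlap (the "80" in 80/30)
side_overlap = 0.30         # 30% lateral overlap (the "30" in 80/30)

# New ground advanced per photograph, and the trigger interval it implies.
advance = footprint_along * (1.0 - forward_overlap)     # 200 m
capture_interval = advance / ground_speed               # ~2.9 s

# Spacing between parallel flight lines for the desired lateral overlap.
line_spacing = footprint_across * (1.0 - side_overlap)  # 1050 m

print(f"trigger every {capture_interval:.1f} s, lines {line_spacing:.0f} m apart")
```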
Although the amount of forward and side overlap may vary from application to application, a common overlap in current aerial mapping systems is 80/30, indicating 80% forward overlap between sequential photographs along a flight line and 30% side overlap between photographs on adjacent parallel flight lines. In such a configuration, capturing sequential images during forward translation along a flight line results in only 20% of each image containing new information. Collecting data in this way allows a feature, image element or common point to be identified within around 5 images. In terms of redundancy, for the example mentioned above, any point, pixel, set of pixels, element, image element, object, or feature in this common area has a redundancy of 4 (the original image plus four more images in which this point or object is identifiable). Thus, a set of sequential images having an overlap of 80% can be considered as having a redundancy of 4. In general, redundancy can be described as the number of images (in a set of images) in which a point appears on average, less one. The points that are captured redundantly may or may not be used as image elements, but such points or pixels appear in multiple images within the set. As will be understood by those skilled in the art, for high redundancy values, the number of images in which a point appears on average (n) approaches the redundancy (n − 1). The amount of redundant information in the image sets is additionally increased by the side overlap, resulting in only around 14% of each image containing new information and around 86% of the image information being redundant in terms of the final photomap. As will be understood by those skilled in the art, increasing overlap, whether forward overlap, side overlap, or overlap generated by other operations or mechanical configurations, will increase the redundancy in the image sets.
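The relationship between overlap and redundancy described above can be checked with a short calculation; the sketch below merely reproduces the figures quoted in this paragraph:

```python
# A minimal sketch reproducing the redundancy figures quoted above.

def redundancy(forward_overlap: float, side_overlap: float = 0.0) -> float:
    """Average number of extra images in which a ground point appears.

    The fraction of new information per image is (1 - forward) * (1 - side),
    so a point appears on average in n = 1 / new_fraction images, and the
    redundancy is n - 1.
    """
    new_fraction = (1.0 - forward_overlap) * (1.0 - side_overlap)
    return 1.0 / new_fraction - 1.0

print(redundancy(0.80))        # 4.0  -> 80% forward overlap alone
print(redundancy(0.80, 0.30))  # ~6.1 -> 80/30, i.e., ~14% new information
```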
In one embodiment of the present systems and methods, at least two image generation systems/subsystems are used to capture overview images and detail images.
In another embodiment, at least two imaging systems/subsystems are used to capture overview images at a first resolution level, and detail images at a second resolution level, the second resolution level being more refined (more image detail) than the first resolution level.
As illustrated in figure 1, the detail images 122, 124 and 126, captured by the second system 120, are located partially or completely within the capture area of an overview image 112, captured by the first system 110. The first and second systems 110 and 120 can be translated, typically along the geometric axis x 115. Figures 5C and 5D illustrate the capture of overview and detail images from an aircraft and along a typical aerial survey path, respectively.
Images are collected so that there is significant overlap in the overview images, but the overlap in the detail images is significantly reduced or minimized relative to the amount of overlap in the overview images.
Similarly, the amount of overlap of the detail images in one or more embodiments of the present systems and methods is greatly reduced relative to the images obtained by other, traditional image generation systems.
As there is a significant amount of overlap in the overview images, there is high redundancy in these low resolution images, this redundancy being used for image processing related to the generation of the photomap.
Detail images, which are at the desired resolution for photomaps, have a much smaller amount of redundancy, thus reducing the storage and processing requirements for these images.
Higher levels of redundancy or overlap increase the ability to precisely calculate the exterior and interior orientation for the camera system.
However, the increased redundancy is largely wasted when creating the final photomap, as significantly more image data is captured than is needed to create the final photomap.
Excess data collection increases the time and cost involved in conducting a survey. For example, if a traditional aerial imaging system is flown at an altitude sufficient to produce a photomap with a 10 cm ground pixel size using 80/30 overlap, approximately 100 Terabytes (TB) of image data would have to be collected to generate a final photomap that is approximately 14 TB in size. Thus, the 10 cm ground pixel resolution images will have a redundancy of around 6 (corresponding to only 14% of new information in each image), and these images will serve both for the calculation of the exterior and interior orientation of the camera system and for generating the final photomap.
Alternatively, the use of the present methods and systems would allow the use of a first camera system providing a ground pixel size of 100 cm at a high redundancy (e.g., 98) with a very small unique coverage area per photograph (approximately 1%), and a second camera system providing high resolution at 10 cm with a large unique area per photograph of 80%. Using this technique and system would require around 15 TB for the high redundancy photo set and around 15 TB for the low redundancy photo set, for a total storage requirement of less than 30 TB. In addition, due to the high redundancy (98) in the low-resolution photographs, subsequent processing can achieve greater robustness (fewer errors) and greater precision than with images having lower redundancy at higher resolution. For example, if the traditional system has a root mean square (RMS) error of 0.5 pixels, the absolute ground error would be 5 cm (0.5 × 10 cm). Using the methods and systems of the invention, photographs with high redundancy can allow subsequent processing to an RMS of 0.1 pixels, for an absolute ground error of 0.1 × 100 cm = 10 cm. This can be further improved by locating the more detailed images within the images with high redundancy, resulting in the ability to obtain absolute ground error levels that are comparable to or less than those of previous systems.
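The storage arithmetic of this example can be verified directly (an illustrative check only, using the unique-area fractions quoted above):

```python
# A minimal sketch checking the storage figures quoted above.

final_photomap_tb = 14.0  # final 10 cm photomap size quoted in the text

# Traditional system: 80/30 overlap leaves ~14% new information per image,
# so the raw collection is the final product divided by that fraction.
traditional_raw_tb = final_photomap_tb / 0.14    # ~100 TB

# Proposed system: a 100 cm overview set with ~1% unique area per photo
# covers 1/100 of the pixels of the 10 cm product per unit area ...
overview_final_tb = final_photomap_tb / 100.0    # ~0.14 TB
overview_raw_tb = overview_final_tb / 0.01       # ~14 TB  (~15 TB quoted)

# ... plus a 10 cm detail set with ~80% unique area per photo.
detail_raw_tb = final_photomap_tb / 0.80         # ~17.5 TB (~15 TB quoted)

print(traditional_raw_tb, overview_raw_tb + detail_raw_tb)  # ~100 TB vs ~30 TB
```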
In one embodiment, the methods and systems of the invention employ the use of multiple sets of cameras, each set of cameras potentially comprising multiple cameras. Thus, the resolution is not limited to that of current camera systems. For example, current camera systems such as those offered by the Vexcel corporation may have a resolution of 300 megapixels, but this is achieved through the use of multiple cameras that are mounted on an extremely rigid and pre-calibrated platform. Using the methods and systems of the invention, it is possible to create a virtual camera system with extremely high resolution, for example, 10 gigapixels. Due to the demanding requirements of aerial photography, camera systems are typically customized for the particular aerial photography application. Traditional systems do not take advantage of Commercial Off The Shelf (COTS) components and, thus, do not easily take advantage of advances in digital photography, such as the relatively low (and continuously decreasing) cost of digital single-lens reflex (D-SLR) cameras. The great weight and high cost of the camera systems required by traditional approaches encourage or require the use of dual engine turboprop aircraft, which additionally increases operating costs, as such aircraft are significantly more expensive to operate than common commercial single-engine aircraft, such as the Cessna 210. In addition, the use of common traditional systems requires custom modifications to the aircraft for mounting the cameras. In contrast, the methods and systems of the invention allow, in certain embodiments, the use of single-engine aircraft, having lower operating costs than dual-engine aircraft, and do not require modifications to the aircraft structure.
Using the methods and systems of the invention, high resolution digital images can be captured across large areas for airborne and space-based topographic surveys. Data collection times can be significantly reduced compared to current systems. Thus, capital and operating costs can be reduced, and aerial surveys can be conducted quickly when the climate allows. In some embodiments, high-resolution surveys can be captured from high altitudes, thereby reducing the impact on Air Traffic Control, providing smoother flight conditions for the aerial survey crew, and generally reducing the pilot's workload.
Additionally, different types of cameras, or cameras used at different angles, can be used to collect images at different resolutions and with different degrees of redundancy. For example, in collecting image data for photogrammetry applications, overhead cameras can be used to collect overview images at a relatively low resolution with a high degree of redundancy, and oblique cameras can be used to collect high resolution data with a low degree of redundancy. Other combinations of cameras and resolutions/redundancies are possible, both for photogrammetry applications and for other applications. Using the methods and systems of the invention, different types of cameras can be combined to generate nadir photomaps, oblique photomaps, infrared photomaps, or other combinations as dictated by the survey requirements.
Although described in this document as detail and overview camera systems, additional sets of cameras (or other types of image capture devices) can be incorporated to form cascades of image capture systems operating at different resolutions and with different amounts of redundancy. Because there are greater degrees of redundancy in images with lower resolutions than in images with higher resolutions, it is possible to have the appropriate amount of redundancy for image processing (for example, bundle adjustment, generation of digital elevation maps), while at the same time minimizing the amount of redundancy in the higher resolution images. For example, the method and system described in this document can be used with three sets of cameras, the first set of cameras operating at a low resolution with high redundancy, the second set of cameras operating at a medium resolution with medium redundancy, and the third set of cameras operating at a high resolution with low redundancy.
In general, cascading can be performed using multiple sets of cameras that capture images with varying degrees of overlap, resolution and/or redundancy, so that sets of low resolution images have greater redundancy than sets of images taken at a higher resolution.
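Such a three-level cascade might be represented as a simple configuration structure; in the following sketch the resolutions and redundancies are assumed example values chosen to be consistent with the text, not values taken from the specification:

```python
# A minimal sketch of a cascaded camera configuration; the numbers are
# assumed example values in which redundancy decreases as resolution
# becomes finer, as described above.

cascade = [
    {"set": "overview", "ground_pixel_cm": 100, "redundancy": 98},
    {"set": "medium",   "ground_pixel_cm": 30,  "redundancy": 10},
    {"set": "detail",   "ground_pixel_cm": 10,  "redundancy": 1},
]

# Verify the cascade property: finer resolution implies lower redundancy.
ordered = sorted(cascade, key=lambda s: s["ground_pixel_cm"], reverse=True)
redundancies = [s["redundancy"] for s in ordered]
assert redundancies == sorted(redundancies, reverse=True), "not a cascade"
```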
As will be understood by those skilled in the art, the camera arrangements can be extended to n cameras or n sets of cameras, with no limitations on specific physical arrangements.
The cascade of cameras can produce images with a spectrum of resolutions, with the redundancy being lower in the images with higher resolution.
A set of cameras, whether organized in a linear fashion, in a matrix (row and column format), or in a hierarchy of magnifications, can be considered to be organized in a cascaded manner when the result is several captured images having different terrain resolutions.
As an example, a set of four cameras arranged as a matrix can be arranged in a cascaded manner by capturing images at different terrain resolutions, or at different terrain resolutions with different magnifications.
If the cameras are organized to cover or overlap the same terrain areas, there will be redundant image data between the captured images.
As understood by those skilled in the art, after the images have been captured, whether through these methods or the methods of the prior art, they can be processed using photogrammetric tools in order to produce a series of products, such as photomaps or digital elevation maps.
Common software programs used for such processing include, but are not limited to, one or more of the following programs: the Match-AT triangulation software sold by Inpho Corporation; the digital mapping software sold under the trademark Socet Set® by BAE Systems®; the Socet Set® software that is integrated with the photogrammetric bundle adjustment software sold as BINGO by GIP mbH; and the ERDAS ER Mapper image processing software sold by ERDAS®. In addition, a wide variety of image processing and triangulation software sold by various producers can be used to process the data.
The image generation systems/subsystems for capturing overview and detail images can be placed together in a vehicle suitable for image capture (for example, an aircraft, spacecraft, submarine or balloon) or can be located on different platforms. In various embodiments, the overview and detail imaging systems are located together in an enclosure (for example, a compartment) that attaches to a small aircraft. In one or more embodiments, the overview and detail images are captured substantially simultaneously. An image capture signal can be generated from a synchronization system/subsystem (for example, a system controller), which facilitates the almost simultaneous capture of the overview and detail images.
In one or more embodiments of the systems and methods of the invention, the overview images are collected so that the overlap of sequentially captured overview images (hereinafter referred to as sequential overview images) is greater than or equal to 50% in the forward direction. In an alternative embodiment, the overlap of sequential overview images in the forward direction is at least 90%. In one embodiment, the overlap of sequential detail images in the forward direction is in the range of 0% to 20%. Other embodiments with other combinations of overlap are possible, as will be understood by those skilled in the art, and consistent with having the degree of overlap in the sequential detail images significantly less than the degree of overlap in the sequential overview images.
In one embodiment of the methods and systems of the invention, a first image capture system is used to capture an overview image of an overview area, while a second image capture system captures, at substantially the same time, a detail image of at least part of the overview area, with redundancy existing between the overview images, and redundancy existing between the detail images.
In terms of redundancy, in one embodiment, the redundancy in the overview images is greater than 10, while the redundancy in the detail images is less than or equal to 10. In another embodiment, the redundancy in the detail images approaches zero. In yet another embodiment, the redundancy in the detail images is occasionally less than zero (negative), indicating gaps in the captured images.
Due to the high redundancy in the overview images, the gaps in the detail images can be recreated or filled through subsequent image processing.
As will be appreciated by those skilled in the art, the degree of redundancy can be varied depending on the environment or the conditions under which the images are being collected. In environments with poor visibility or with rapid change, the degree of redundancy may need to be extremely high. For example, in fog or dust conditions, or in underwater applications, the solution can tend toward further redundancy. This can be done through several mechanisms, including the use of more overview cameras or more frequent image capture (even incorporating video frame rates). In the case of underwater applications, multiple 270º sensors, working very close to video frequency, could be used to collect overview images with very high redundancy, while a single camera could be used to obtain high resolution / low redundancy images. Conversely, in an environment that changes less over time (for example, viewing an entire planet from space), the degree of redundancy in the overview images could be reduced.
In one application, the overview and detail images are collected simultaneously, thereby ensuring that the redundant images contain a sufficient number of potential common features, common elements, common points, or image elements, and minimizing the effects of object movements or changes in lighting. In another embodiment, the overview and detail images are captured from approximately the same location. In yet another embodiment, the overview and detail images are captured simultaneously from approximately the same location. In one or more embodiments of the systems and methods of the invention, the image capture systems/subsystems use digital cameras. In one or more embodiments, the digital cameras are CMOS-based cameras or sensors. In an alternative embodiment, a push broom sensor is used, and in yet another embodiment, a whisk broom sensor is used for image capture. Other mechanisms for image capture of both overview and detail images can be used, including, but not limited to, analog film systems, point or line scanners, CCD imaging arrays, other imaging devices based on III-V or II-VI semiconductors, ultrasound image generators, and infrared (thermographic) image generators. The image generators operate based on the reception of electromagnetic rays and can operate in the infrared, visible, or other parts of the electromagnetic spectrum. Large format and multiple lens, multiple detector, and multiple detector/lens systems, such as those described in US Patent 7,009,638 to Gruber et al., and in US Patent 5,757,423 to Tanaka et al., whose descriptions in their entirety are incorporated into this document by reference, can also be used to capture both overview and detail images. Additionally, multiple image collection systems, such as the Multi-cameras Integrated Digital Acquisition System (MIDAS) offered by the TRACK'AIR corporation, and other systems configured to provide detailed metric oblique views, can be adopted and incorporated into the methods and systems of the present invention.
In one or more embodiments of the methods and systems of the invention, a synchronization system/subsystem is used to generate image capture signals that are fed into the image capture systems/subsystems and cause the capture of the overview and detail images.
In one or more embodiments, the synchronization system/subsystem is based on a microcontroller or microprocessor with appropriate software, firmware and accompanying hardware to generate electronic or optical signals that can be transmitted via cabling or through space (for example, wirelessly) to the image capture systems/subsystems.
Alternatively, a specialized electronic hardware device, working in conjunction with a navigation system, such as a GPS-based navigation system, or alone, can act as the synchronization system/subsystem to generate image capture signals.
In one or more embodiments, the image capture signals are generated in a system controller in the form of a computer (for example, a laptop or ruggedized computer) and are received by the digital cameras that form the image generation systems for the overview and detail cameras.
There are inherent skews in the transmission of signals through cables (typically having different lengths) and inherent delays in digital cameras, so that there are small variations in the actual time of image capture; the use of one or more synchronized image capture signals nevertheless results in substantially simultaneous capture of images.
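A controller loop of the kind described might look like the following sketch; the emit_trigger function and the footprint, overlap and speed values are hypothetical stand-ins, not part of the specification:

```python
# A minimal sketch of a one-way capture-signal generator; emit_trigger()
# is a hypothetical stand-in for whatever pulses the cameras' remote
# trigger inputs, and the numbers are assumed example values.
import time

def emit_trigger() -> None:
    # Hypothetical: raise/lower a digital output wired to all cameras.
    print("capture pulse", time.monotonic())

footprint_along = 1000.0    # assumed detail footprint in flight direction (m)
forward_overlap = 0.10      # low forward overlap for the detail set
ground_speed = 250.0 / 3.6  # assumed ground speed (m/s)

period = footprint_along * (1.0 - forward_overlap) / ground_speed

next_shot = time.monotonic()
for _ in range(5):          # a few pulses for illustration
    emit_trigger()          # one-way signal; no return data is required
    next_shot += period
    time.sleep(max(0.0, next_shot - time.monotonic()))
```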
In one or more embodiments, the image capture signal is a unidirectional signal emanating from the synchronization system/subsystem, and no return signal from the image capture systems/subsystems is required.
Similarly, image capture data can be stored locally on imaging devices (for example, digital cameras) and no image data needs to be returned from the imaging devices to the controller or to other data storage devices.
The data storage used for storing images includes, but is not limited to: solid state memory devices such as flash memory, Static Random Access Memory (SRAM), and Dynamic Random Access Memory (DRAM); magnetic storage devices including, but not limited to, tapes, magnetic drums, core memory, core rope memory, thin-film memory, twistor memory and bubble memory; electromagnetic storage devices including, but not limited to, hard drives or disk drives and floppy disks; optical storage devices including, but not limited to, photographic film, holographic memory devices and holograms, and optical discs; and magneto-optical drives and data storage devices.
Figure 1 is a block diagram of an illustrative system 100 consistent with some aspects related to the methods and systems of the invention. Referring to figure 1, system 100 may comprise a first system 110 that acquires at least one overview image 112, and a second system 120 that acquires detail images 122, 124, 126. The system can be oriented in an x-y coordinate system as illustrated in figure 1, with the geometric axis x 115 and the geometric axis y 114. In one embodiment, the image capture devices (for example, cameras) are arranged to capture detail images 122, 124, 126 in strips along a detail geometric axis 130, the detail geometric axis 130 being generally parallel to the geometric axis y 114.
Each of the first and second systems 110 and 120 may include one or more image capture devices, for example, cameras (throughout this description, the broad term "image capture device" is often referred to as a "camera" for convenience, not limitation). In addition, an image generation arrangement can be created through an array of individual sensors that are used to capture an image, and can act as an individual image capture device or camera. Individual cameras or image capture devices can be arranged in a linear layout, arranged along a geometric axis and set at varying angles to capture different areas of the terrain, or arranged in a matrix or array format (rows and columns). When arranged so that the image capture devices capture adjacent or nearby image areas, whether overlapping or not, the devices can be considered as being arranged in an adjacent manner.
In one embodiment, the first system 110 and the second system 120 are translated in an x direction, with images captured periodically, so that a high degree of overlap is created in the sequential overview images captured by the first system 110, and a lesser degree of overlap is created in the sequential detail images captured by the second system 120. In the various embodiments, the overview images have a lower resolution than the detail images, in order to produce high redundancy within the overview images without creating unnecessary data storage and processing requirements.
As illustrated in figure 1, due to the physical layout of the imaging systems or cameras, the detail image 122 has some overlap with the detail image 124, and the detail image 124 has some overlap with the detail image 126 in the direction of the detail geometric axis 130. As will be understood by those skilled in the art, the translation of the first system 110 and the second system 120 along the geometric axis x 115 with periodic image capture allows a strip of terrain to be represented in the detail images 122, 124 and 126, with the overlap ensuring that the detail images capture a contiguous swath corresponding to a strip of terrain.
The movement or translation of the first system 110 and the second system 120, together with periodic image capture, results in the capture of contiguous strips having a first degree of forward overlap at the detail image level, and the capture of overview images having a second degree of forward overlap, the second degree of overlap being higher than the first degree of overlap.
In alternative embodiments, the first system 110 and the second system 120 are translated along the geometric axis y 114.
In yet another embodiment, the first system 110 is translated separately from the second system 120. In yet another embodiment, the overview image 112 and the detail images 122, 124 and 126 are captured at separate times by the first system 110 and the second system 120, respectively.
In addition, the first and second systems 110, 120 may include arrangements of digital image capture devices, such as cascaded or adjacent groups of multiple cameras mounted on rigid or semi-rigid supports. Those skilled in the art will appreciate that such assembly details are illustrative. For example, the term rigid or semi-rigid mounting system can describe any device capable of precisely defining the relative position of an image generation system such as a single camera or multiple cameras. Such a mounting system can be constructed in several ways. For example, the mounting system can comprise a rigid structure, such as the mounting of the cameras in a compartment enclosure; or it can comprise cameras held in independent, yet accurately known, positions relative to each other, such as cameras mounted on multiple distinct aerial or satellite systems with a local reference system used to define the relative positioning of the cameras between satellites. Alternatively, the first system 110 may consist of a low resolution imaging arrangement, and the second system 120 may consist of one or more high resolution imaging arrangements, with the arrangements and imaging geometry selected so that the low resolution image generation arrangement of the first system 110 captures the overview image 112, and the high resolution image generation arrangements capture the detail images 122, 124 and 126. The system 100 in figure 1 is also illustrative of the various configurations that may be present between or among the systems 110, 120 and/or their image capture devices. For example, figures 2A and 2B are block diagrams illustrating different arrangements of the first system 110 and the second system 120 consistent with the methods and systems disclosed in this document. In both figures 2A and 2B, the imaging systems 210A and 220A are used with the first system 110 and the second system 120, respectively. Figure 2A illustrates an implementation where the first system 110 and the second system 120 are located at a fixed location, such as on an aerial platform, inside or on an aircraft including, without limitation, a fixed-wing aircraft or helicopter, on a satellite, on a high-altitude observation platform or in space, or inside or on a vessel navigating the ocean, such as a ship, submarine, or other underwater vessel. In this embodiment, the first system 110 and the second system 120 are located close to each other and are moved together. In other applications, the closely located first system 110 and second system 120 are used for observations of the terrain, observations of the earth and sky, generation of underwater images, or generation of microscopic images.
Figure 2B illustrates an embodiment where the first system 110 is positioned separately from the second system 120. In this embodiment, the first and second systems are kept independent, but the locations of the two (or more) systems in relation to each other are precisely known or calculated. In a physical structure, this can be accomplished through a rigid assembly such as a compartment enclosure. Alternatively, tracking the relative position between the first system 110 and the second system 120 allows the use of two completely independent platforms. In one embodiment, a first aircraft or other type of vehicle can create overview images using the first system 110, while a second aircraft or other type of vehicle can create detail images using the second system 120. Navigational or inertial guidance systems can be used to determine the relative positioning of the systems. In yet another embodiment, the systems are mounted on multiple distinct satellite systems with a local reference system used to define the relative positioning of the cameras between satellites.
Figure 3 is a block diagram of another illustrative system consistent with some aspects related to the innovations in this document.
As shown in figure 3, a unitary platform or module 310 can include or incorporate both the first system 110 and the second system 120. The unitary platform can be any arrangement or configuration in which the first system and the second system are connected in a fixed manner and can be moved or translated together.
According to additional implementations, the platform 310 may also include various arrangements of the first and second image capture devices or cameras.
Referring to figure 3, the image capture systems 210A and 210A' represent first image generation systems that capture overview images at a first resolution.
The number of image capture systems that capture overview images at a first resolution can be extended, as illustrated by the image capture system 210A", and as such, multiple cameras or other imaging devices can be used to create the overview image 112. In one embodiment, each of the first image generation systems 210A, 210A' through 210A" is used to obtain the complete overview image 112, while in an alternative embodiment, the first image generation systems 210A, 210A' through 210A" are arranged to obtain segments of the overview image 112 and, as such, support the assembly of an entire overview image.
In one embodiment, the first imaging systems 210A, 210A' through 210A" are arranged along the detail geometric axis 130. In alternative embodiments, the first imaging systems 210A, 210A' through 210A" are arranged along the geometric axis x 115, in an array format, or in any other arrangement that provides coverage of the overview area to be captured in the overview image 112. As previously discussed, the arrangements and/or arrays of image generation devices can be configured to create a cascade of image generation systems producing a spectrum of resolutions, with redundancy generally decreasing with increasing resolution.
Again referring to figure 3, the detail images 122, 124 and 126, having a higher resolution than the overview image 112, are captured with the second image generation systems 220A, 220A' and 220A", respectively. In one embodiment, the detail images 122, 124 and 126 are overlapping detail images aligned along the detail geometric axis 130, the detail geometric axis 130 being substantially parallel to the geometric axis y 114. In other embodiments, the second image generation systems 220A, 220A' and 220A" are all arranged along the geometric axis x 115, in an array format, or in any other overlapping or non-overlapping format that allows the capture of detail images such as the detail images 122, 124 and 126.
In one embodiment, the first imaging systems 210A, 210A' through 210A" and the second imaging systems 220A, 220A' and 220A" are all based on the same type of imaging system, such as a camera operating in the visible part of the spectrum. In an alternative embodiment, the individual imaging systems within the first imaging systems 210A, 210A' through 210A" and the second imaging systems 220A, 220A' and 220A" are different. For example, the first imaging system 220A can operate in the visible region of the spectrum, while the second imaging system 220A' can operate in the infrared part of the spectrum. Similarly, the second imaging systems 220A, 220A' and 220A" can be of different types (for example, visible and infrared) and can be arranged so that the detail image 122 is captured twice or more, once for each of the two or more imaging systems. As will be understood by those skilled in the art, the detail images 122, 124 and 126 can be captured by multiple types of image generation systems (for example, visible or infrared), or with each detail image being captured by a single type of imaging system. Referring to figure 4, a unit module 400 is disclosed, including a first overview camera 410A subtending a first overview camera view 411A, a second overview camera (not shown in figure 4) subtending a second overview camera view 411B, a first detail camera 420A subtending a first detail camera view 421A, a second detail camera 420B subtending a second detail camera view 421B, a third detail camera 420C subtending a third detail camera view 421C, a fourth detail camera 420D subtending a fourth detail camera view 421D, a fifth detail camera 420E subtending a fifth detail camera view 421E, a sixth detail camera 420F subtending a sixth detail camera view 421F, a seventh detail camera 420G subtending a seventh detail camera view 421G, an eighth detail camera 420H subtending an eighth detail camera view 421H, a ninth detail camera 420I subtending a ninth detail camera view 421I, a tenth detail camera 420J subtending a tenth detail camera view 421J, and an eleventh detail camera 420K subtending an eleventh detail camera view 421K.
Local data storage can be used with each camera, thus eliminating the need to save back to a memory or central storage location.
Local data storage can be comprised of any type of digital memory including, but not limited to, flash memory or other non-volatile memory, volatile memory and associated systems for retaining information in such memory, disk drives, or other types of media or digital storage systems.
Alternatively, cameras can share local memory.
In the latter case, some of the innovations in this document include aspects of compressing and/or storing images in association with each camera, rather than requiring that captured photographs be transmitted to and stored in a central storage system.
Parallel photo compression and storage with each camera increases the maximum throughput and storage for the camera system, which allows surveys to be conducted faster, more data to be stored, and flight times to be increased. Such parallel compression and storage with each camera also increases the reliability of storage, as it allows the use of a compact flash medium or solid-state drive with each camera.
Existing digital imaging systems store the raw linear sensor data as 12- to 16-bit data in a central storage system. In contrast, by performing compression on each camera in parallel, the data can be converted to a gamma color space such as YCbCr. This allows the data to be stored as 8-bit data, since higher bit depth is typically only required for raw linear data, and additionally allows for image compression prior to storage in each camera's data store. Conversion to a gamma color space and compression can allow a 10-fold reduction in storage space requirements. For example, in a system with 14 cameras, each with its own 32 GB compact flash memory card, the total of 448 GB of storage can be equivalent to more than around 4,500 GB or 4.5 TB of storage of uncompressed raw photo data. Parallel operation eliminates the need to transmit image data or any other signals from the cameras to the flight control computer system, and thus increases the capture rate for the camera system, thereby reducing cabling and signaling requirements.
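The per-camera conversion described above might be sketched as follows; the 2.2 gamma value and the BT.601 luma/chroma coefficients are common conventions assumed for the example, not values taken from the specification:

```python
# A minimal sketch, assuming NumPy, of converting raw 16-bit linear sensor
# data to an 8-bit gamma-encoded YCbCr representation before compression.
import numpy as np

def raw_to_ycbcr8(raw: np.ndarray) -> np.ndarray:
    """raw: HxWx3 uint16 linear RGB; returns HxWx3 uint8 YCbCr."""
    rgb = raw.astype(np.float64) / 65535.0
    rgb = rgb ** (1.0 / 2.2)              # assumed simple gamma encoding

    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  =  0.299 * r + 0.587 * g + 0.114 * b          # BT.601 luma
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b + 0.5
    cr =  0.5 * r - 0.418688 * g - 0.081312 * b + 0.5

    out = np.stack([y, cb, cr], axis=-1)
    return np.clip(out * 255.0, 0, 255).astype(np.uint8)

# 16-bit linear data occupies twice the bytes of the 8-bit result even
# before JPEG-style compression adds a further large reduction.
frame = (np.random.default_rng(0).random((64, 64, 3)) * 65535).astype(np.uint16)
print(raw_to_ycbcr8(frame).nbytes, "bytes vs", frame.nbytes, "raw bytes")
```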
A flight plan and image capture synchronization subsystem can be used to generate one or more capture signals to be sent to the cameras, as shown in figure 4. In one embodiment, a single capture signal is sent from the flight plan and image capture synchronization subsystem to each camera. However, differences in cable lengths, camera delays, and other variables can result in photographs being taken at slightly different times. Additionally, the cameras' local timing signal generators may be inaccurate or may fluctuate.
In one embodiment, digital cameras, typically containing three CMOS image sensor arrays, are used to capture the overview and detail views.
In an alternative embodiment, push broom sensors, comprised of a linear array of optical sensors, can be used to capture detail images and serve as the detail image capture system.
In another embodiment, a whisk broom or spot sensor can be used to generate the detail images.
When using a whisk broom sensor, a mirror-based scanning system or another type of scanning system creates the image by generating the image from a single sensor point.
The scanning system can be integrated with the synchronization and navigational systems so that the scanning rate is appropriately synchronized with the forward movement of the vehicle transporting the camera systems, to create the detail image with the appropriate resolution.
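For a line scanner, the required scan rate follows directly from the ground speed and the desired ground pixel size; the following sketch uses assumed example values:

```python
# A minimal sketch, under assumed example values, of matching a push broom
# sensor's line rate to the vehicle's forward motion.

ground_speed = 250.0 / 3.6   # assumed ground speed: 250 km/h in m/s
ground_pixel = 0.10          # desired ground resolution: 10 cm in metres

# Each scan line must be captured before the vehicle advances one pixel.
line_rate_hz = ground_speed / ground_pixel   # ~694 lines per second
print(f"required line rate: {line_rate_hz:.0f} Hz")
```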
Those skilled in the art will recognize that the quantities (that is, of both the cameras and the arrangements) of detail cameras can be adjusted to provide the desired image results.
Advantages consistent with such implementations include the ability to configure and/or reconfigure the module 400 to target different survey requirements, such as collecting vertical or oblique (high and low) images, or combinations thereof.
As understood by those skilled in the art, vertical images or photographs are those taken with the camera's geometric axis directed as vertically as possible, whereas oblique images or photographs refer to those taken with the camera's geometric axis intentionally tilted away from the vertical.
In addition, those skilled in the art will understand that high oblique images or photographs generally include the horizon, while low oblique images or photographs generally do not include the horizon.
Referring to figure 4, the various cameras can be arranged on the unit module 400 so that the cameras are generally aligned along the module geometric axis 450. In one embodiment, the module geometric axis 450 is substantially parallel to the geometric axis x 115 of figure 3, which is typically the direction of forward travel of the aircraft or other vehicle.
In this embodiment, the detail geometric axis 130 (not shown in figure 4) is substantially perpendicular to the module geometric axis 450, and the detail cameras are arranged to create an image generation swath that is substantially parallel to the geometric axis y 114 of figure 3.
Figures 8A and 8B provide examples of the details of camera arrangements that can be used in one embodiment. The specific examples disclosed in this document are not to be considered limiting and in no way restrict the use of the methods and systems disclosed in this document, which can be applied to various types and configurations of imaging systems.
For example, although the illustrative layout details refer to Canon or Nikon equipment, other types of imaging equipment, combinations thereof, or different combinations of camera groups, layouts, or lenses can be used.
In one embodiment, the cameras are grouped so that the overview cameras (Canon or Nikon brand cameras) comprise a vertical overview camera in the form of a camera with a 28 mm lens pointing vertically downwards, as mentioned in Table 1 of figure 8A, and a rear overview camera with a 28 mm lens pointing backwards (opposite the direction of movement of the aircraft or other vehicle) at a 35 degree angle from the vertical.
In this embodiment, the Canon high-resolution cameras comprise a vertical group of five cameras with 200 mm lenses and with a group spacing of -19º, -9.5º, 0º, 9.5º, 19º and 28.5º; a lateral oblique group comprised of three cameras having 200 mm lenses and a group spacing of 38º, 47.5º and 57º; and a rear oblique group comprised of three cameras with 135 mm lenses with a group spacing of -14.5º, 0º and 14.5º, inclined 50º from the vertical. In the case of the Nikon high resolution cameras, a vertical group of 6 cameras with 180 mm lenses has a group spacing of -21º, -10.5º, 0º, 10.5º, 21º and 31.5º; a lateral oblique group of 3 cameras having 180 mm lenses has a group spacing of 42º, 52.5º and 63º; and a rear oblique group of 3 cameras having 135 mm lenses has a group spacing of -14.5º, 0º and 14.5º, inclined 50º from the vertical. In an alternative embodiment, a first set of cameras is configured with wide-angle lenses and is used to capture photographs with a very large amount of overlap, such as 50/99 (50% sideways and 99% forward). Photographs captured by these cameras cover a large area per photograph, and the high degree of overlap and redundancy results in common features, common elements, common points, or image elements being visible in many more photographs than in previous systems, thus allowing the precise determination of interior and exterior orientation even without the use of a stabilized platform. A second set of cameras can be configured with longer focal length lenses and used to capture the detail images for generating the detailed photomaps of the survey. A low amount of overlap is used with these cameras to minimize redundancy and to maximize the use of the photographed images for surveying in detail, significantly reducing overall costs and the time required to complete the survey.
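The ground pixel size implied by a given lens and altitude can be estimated with simple similar-triangle geometry; in the sketch below, the 6.4 µm pixel pitch is an assumed typical full-frame D-SLR value, not a figure from the specification:

```python
# A minimal sketch estimating ground pixel size (GSD) for a nadir camera;
# the pixel pitch is an assumed typical full-frame D-SLR value.

def ground_pixel_cm(pixel_pitch_um: float, focal_mm: float, alt_ft: float) -> float:
    altitude_m = alt_ft * 0.3048
    gsd_m = (pixel_pitch_um * 1e-6) * altitude_m / (focal_mm * 1e-3)
    return gsd_m * 100.0

# A 200 mm lens at 8,000 feet with an assumed 6.4 um pixel pitch gives
# roughly the 7.5 cm figure quoted later in this description.
print(f"{ground_pixel_cm(6.4, 200.0, 8000.0):.1f} cm")
```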
Figure 5A illustrates an illustrative implementation including an external compartment mounted on a small, single-engine aircraft 510. Referring to figure 5A, in one embodiment of the invention, the cameras for the camera system are mounted inside a removable compartment or housing 520, which serves as the unit module 400. Thus, it is possible to use the camera system on a small standard aircraft 510, such as a Cessna 210, without requiring modifications to the aircraft. Figure 5B illustrates an illustrative implementation of an image capture system. As shown in figure 5B, the removable compartment or housing 520 can include several overview and detail cameras 410 and 420, which can be grouped or arranged as previously described with reference to figures 4, 8A and 8B.
Implementations such as those shown in figures 5A and 5B provide high accuracy without requiring a stabilized mounting platform, and also allow sufficient reduction in weight and size to enable the camera system to be mounted in an Unmanned Aerial Vehicle (UAV). Aerial surveys can be carried out at different altitudes and with different flight times, with different resulting resolutions.
For example, and in accordance with the camera configurations illustrated in figures 8A and 8B, an aerial survey performed with a Canon 1Ds MkIII vertical camera with a lens with a focal length of 200 mm at an altitude of 8,000 feet can generate data for a final photomap with a resolution of 7.5 cm.
In this example, with a capture rate of 330 km² per hour, a typical city of 50 km x 40 km can be captured in a flight time of 6 hours.
In another embodiment, corresponding to the camera configurations illustrated in figure 8C, with 1 Canon 1Ds MkIII vertical overview camera with a 28 mm focal length lens and 9 Canon 1Ds MkIII detail cameras with lenses with a focal length of 300 mm, at an altitude of 10,000 feet, a capture rate of 500 km² per hour can be achieved, resulting in a flight time of 4 hours to capture a typical city of 50 km x 40 km with a resulting resolution of 6.5 cm.
Higher resolutions can be captured using the same embodiments discussed above or, in other embodiments, by using longer flight times (for example, a 3.5 cm resolution captured in a 9-hour aerial survey) at lower altitudes.
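These survey times follow from the area of the survey region and the quoted capture rates (an illustrative check of the arithmetic only):

```python
# A minimal sketch checking the survey-time arithmetic quoted above.

city_area_km2 = 50 * 40        # typical city: 50 km x 40 km = 2000 km²

print(city_area_km2 / 330.0)   # ~6.1 h at ~330 km² per hour
print(city_area_km2 / 500.0)   # 4.0 h at 500 km² per hour
```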
The aerial surveys mentioned above are only representative examples and are not intended to limit the scope of the invention, which can be practiced under a wide variety of conditions.
For subsea applications, the altitude can be understood as being comparable to the distance above the seabed.
As will be appreciated by those skilled in the art, various configurations of imaging systems can be used with different relationships between altitude and resolution, all of these configurations being within the spirit and scope of the invention. In one embodiment, 1 cm of resolution is produced for every 1,000 feet of altitude (for example, 3 cm resolution at 3,000 feet of altitude, and 7 cm resolution at 7,000 feet of altitude). In a second embodiment, the ground point resolution in cm is the altitude in feet divided by 900. In a third embodiment, the ground point resolution in cm is the altitude in feet divided by 800, and in a fourth embodiment, the ground resolution in cm is the altitude in feet divided by 2,000.
Referring to figure 5C, the use of the method and system in one embodiment is illustrated, in which the aircraft 510 is equipped with the removable compartment or housing 520 and travels at a given altitude h 530 (represented along the geometric axis z 117), at a speed v 532; the course is generally executed in the x-y plane as defined by the geometric axis x 115 and the geometric axis y 114. Figure 5D is a flight plan for a survey in the x-y plane, the flight path having a first long segment 560, followed by a turn 564, followed by a long return segment 568. Repeated combinations of long segments, turns, and long return segments can be used to create the flight plan for the survey area. The method and system described in this document may also incorporate a flight plan and synchronization system/subsystem that generates a flight plan suitable for generating a photomap of a particular area, as well as capture signals indicating to the overview and detail image capture systems that the respective images must be captured. In one embodiment, the flight plan contains parameters such as altitude, course direction, airspeed, waypoints and direction change locations. As will be understood by those skilled in the art, the flight plan directs the pilot (or the vehicle, in the case of an unmanned or automatically controlled aircraft) to fly in a pattern that allows the creation of images having the appropriate degree of lateral overlap. Although the overlap in the forward direction is controlled by the timing of the image capture signals, the overlap in the lateral direction is controlled mainly by the air vehicle's path in relation to the previous parallel paths of the flight.
In one embodiment, the flight plan and synchronization system/subsystem receives input signals from navigation equipment, including terrestrial systems (e.g., VOR, LORAN) and satellite systems (e.g., GPS and WAAS), to determine position. Signals generated by inertial systems can be used in conjunction with the location determination signals to determine changes in speed as well as changes in the pitch, yaw and roll of the aircraft. In one embodiment, rapid changes in direction can be determined using Microelectromechanical Systems (MEMS). Both short-term and long-term deviations from the proposed flight plan can be incorporated by the flight plan and image capture system to indicate corrections to the flight plan or to adjust the capture signals being sent to the overview and detail image capture systems.
In one embodiment, the flight plan and image capture synchronization subsystem is based on a personal computer with additional navigational equipment (for example, GPS, D-GPS), displays and programming that allow a flight plan to be developed and image capture synchronization signals to be generated consistent with the desired overlap. In an alternative embodiment, specialized hardware is used to develop the flight plan and to generate the image capture signal.
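A serpentine flight plan of the kind shown in figure 5D might be generated as in the following sketch; the survey dimensions and line spacing are assumed example values, not parameters from the specification:

```python
# A minimal sketch generating serpentine flight-plan waypoints for a
# rectangular survey area; all numeric values are assumed examples.

def serpentine_plan(width_m: float, height_m: float, line_spacing_m: float):
    """Waypoints for parallel flight lines flown in alternating directions."""
    waypoints = []
    y, line = 0.0, 0
    while y <= height_m:
        if line % 2 == 0:                      # west-to-east segment
            waypoints += [(0.0, y), (width_m, y)]
        else:                                  # east-to-west return segment
            waypoints += [(width_m, y), (0.0, y)]
        y += line_spacing_m                    # spacing sets lateral overlap
        line += 1
    return waypoints

# 50 km x 40 km survey area with flight lines 1,050 m apart (30% side
# overlap of an assumed 1,500 m swath).
plan = serpentine_plan(50_000.0, 40_000.0, 1_050.0)
print(len(plan) // 2, "flight lines")
```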
Figures 6A and 6B are diagrams illustrating illustrative representations of overview and detail images. Figure 6A presents an illustrative representation in which multiple cameras are configured to maximize the amount of detail image data 610 obtained in a single (non-overlapping) area through the use of multiple detail cameras, while ensuring that there is sufficient overlap between the overview images 612 to create the desired redundancy to enable successful processing into photomaps.

The representation of figure 6A can be achieved, for example, using an overview camera (see, for example, the representative images 612, 616, 620, 624 of the same) to capture interior and exterior orientation, and a group of nine cameras arranged in an adjacent manner to capture strips 610, 614, 618, 622 of detail photographs covering subparts of each overview photograph at higher resolution than the overview resolution. As discussed above, aspects of the innovations in this document may include fixed or partially adjustable camera alignment in the camera system, which allows photographs to be taken with minimal overlap between the detail images forming the strip. Additionally, images can be obtained frequently enough to ensure that there is overlap between sequential images taken along a flight line, and flight lines can be arranged to ensure that there is overlap between strips of detail images obtained over adjacent flight lines. Unlike existing systems, where significant overlap is required to execute precise bundle adjustment, the innovations of the invention allow the use of a minimal amount of overlap between subsequent, sequential or adjacent strips of detail images, which need only be enough to subsequently create a seamless photomap. As a result, the redundancy required for a strip of photographs from detail cameras is much less than with existing systems, which significantly reduces survey time and costs.
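To make the redundancy contrast concrete, a hedged, illustrative calculation (the overlap fractions below are assumptions, not values from the description): the approximate number of images in which a ground point appears follows from the forward and side overlap fractions:

```python
def redundancy(forward_overlap: float, side_overlap: float) -> float:
    """Approximate number of images in which a ground point appears,
    given fractional forward and side overlaps."""
    return 1.0 / ((1.0 - forward_overlap) * (1.0 - side_overlap))

# Assumed example: overview images at 80% forward / 50% side overlap see
# each ground point ~10 times; detail strips at 10% / 5% overlap see it
# only ~1.2 times.
print(round(redundancy(0.8, 0.5), 1))   # 10.0
print(round(redundancy(0.1, 0.05), 1))  # 1.2
```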
In addition, as many additional detail cameras as required can be configured in an adjacent or cascaded manner to capture detailed subparts of the overview images for specific views, such as aerial nadir (vertical) images or oblique images from different viewing angles. These images can be subsequently processed to produce the corresponding nadir aerial photomaps or oblique photomaps. Because a single detail camera may not have enough resolution to capture a subpart at sufficient resolution for the desired survey, a group of detail cameras for a specific view perspective can be arranged in a strip to capture a wider swath of the desired perspective. Figures 7A and 7B illustrate additional illustrative representations of overview and detail images. Figure 7A illustrates the results of three adjacent groups of detail cameras, in which five cameras produce images corresponding to the detailed vertical view (for example, images 730, 730A through 730E), four cameras produce images corresponding to the detailed right and left oblique views from alternating flight lines (for example, images 740), and three cameras produce images corresponding to the detailed front and rear oblique views from alternating flight lines (for example, images 750, 750A to 750C). Figure 7B illustrates image capture through movement of the vehicle or aircraft, where multiple oblique views are provided by flying flight lines in alternating directions, for example, obtaining four oblique views from two groups of oblique cameras.
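A minimal sketch of how the three detail camera groups of figure 7A might be enumerated, assuming illustrative pointing angles (the 35-degree obliquity and the front/rear split are assumptions, not disclosed values):

```python
# Hypothetical pointing directions for the three detail camera groups of
# figure 7A; the 35-degree obliquity is an assumption for illustration only.
camera_groups = {
    "vertical":   [("nadir", 0.0)] * 5,                              # images 730A-730E
    "right_left": [("right", 35.0)] * 2 + [("left", 35.0)] * 2,     # images 740
    "front_rear": [("front", 35.0)] * 2 + [("rear", 35.0)] * 1,     # images 750A-750C
}

for group, cams in camera_groups.items():
    views = ", ".join(f"{v} @ {a:.0f} deg" for v, a in cams)
    print(f"{group}: {len(cams)} cameras -> {views}")
```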
As previously discussed with reference to figures 8A, 8B and 8C, particular types of cameras can be geometrically arranged to obtain the imaging configurations illustrated in figures 7A and 7B. Those skilled in the art will be able to determine alternative configurations to those disclosed in this document for aerial data capture from various airborne vehicles or aircraft or, in the case of seabed mapping, from offshore vessels.
Images collected using the method and system of the invention overlap with each other, resulting in the appearance of common points in two or more images or photographs. Such points can be referred to as common aspects, common elements, common points, image elements, terrain points, feature points, terrain feature points, tie points, stereo pairs or other terms referring to the repeated appearance of a point or object in several images. In some cases, the points may correspond to objects with known locations; these objects are commonly referred to as control points. Common points can be used to develop an appropriate analytical stereo model through the steps of interior orientation, relative orientation and absolute orientation. Interior orientation generally recreates the geometry that existed in the camera (or other image generation system) when the image or photograph was taken. Analytical relative orientation is the process of determining the relative angular attitude and positional displacement between two photographs that existed when the photographs were taken. The process of analytical absolute stereo orientation relates the coordinates of the control points to their three-dimensional coordinates in a ground-based system.

Generally speaking, given a set of images representing a series of points from different points of view, the traditional bundle adjustment process can be used to adjust all photogrammetric measurements to ground control values (ground points or common points) in a single solution. Bundle adjustment can include determining the spatial object coordinates of all points on the object and the exterior orientation parameters of all photographs. Bundle adjustment simultaneously refines the estimates for the ground point positions and for the exterior and interior orientation of each photograph. A ground point position is identified as a feature in each photograph. A requirement of bundle adjustment is to maximize the average and maximum number of photographs in which a ground point can be identified. If a ground point is identified in too few photographs, then the solution is not very rigid and suffers both from precision errors and from an increased risk of gross errors, where incorrectly identified ground points are used in the bundle adjustment solution. Bundle adjustment is able to refine photographs that have different poses; for example, the photographs may have different oblique angles or may be vertically oriented. Additional information regarding bundle adjustment is known to those skilled in the art and can be found in references such as "Elements of Photogrammetry with Applications in GIS, 3rd edition" by Paul Wolf and Bon Dewitt (McGraw Hill, 2000), US patent 6,996,254 to Zhang et al., and "Bundle adjustment - a modern synthesis" by Bill Triggs, Philip McLauchlan, Richard Hartley and Andrew Fitzgibbon, appearing in Lecture Notes in Computer Science, vol. 1882 (Springer Verlag, January 2000), all of which are incorporated into this document by reference.
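As a hedged illustration of the bundle adjustment described above (a minimal sketch under assumed synthetic data and a simple pinhole camera model, not the implementation used by the described system), the adjustment jointly refines camera poses and ground point positions by minimizing the reprojection error over all observations:

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

F = 1000.0  # assumed focal length in pixels

def project(cam, pt):
    """Pinhole projection of one 3D ground point into one camera.
    cam holds a 3-element rotation vector followed by a 3-element translation."""
    R = Rotation.from_rotvec(cam[:3]).as_matrix()
    x, y, z = R @ pt + cam[3:]
    return np.array([F * x / z, F * y / z])

def residuals(params, n_cams, n_pts, obs):
    """Stacked reprojection errors over every (camera, point, u, v) observation."""
    cams = params[:6 * n_cams].reshape(n_cams, 6)
    pts = params[6 * n_cams:].reshape(n_pts, 3)
    return np.concatenate([project(cams[ci], pts[pi]) - (u, v)
                           for ci, pi, u, v in obs])

# Tiny assumed example: two cameras 100 m above four ground points.
rng = np.random.default_rng(0)
true_cams = np.array([[0.0, 0.00, 0.0,  0.0, 0.0, 100.0],
                      [0.0, 0.05, 0.0, 20.0, 0.0, 100.0]])  # small yaw, 20 m baseline
true_pts = np.column_stack([rng.uniform(-10, 10, 4),
                            rng.uniform(-10, 10, 4),
                            np.zeros(4)])
obs = [(ci, pi, *project(true_cams[ci], true_pts[pi]))
       for ci in range(2) for pi in range(4)]

# Start from perturbed pose and point estimates and refine them together.
x0 = np.concatenate([true_cams.ravel(), true_pts.ravel()])
x0 += 0.01 * rng.normal(size=x0.size)
sol = least_squares(residuals, x0, args=(2, 4, obs))
print("RMS reprojection error (pixels):", np.sqrt(np.mean(sol.fun ** 2)))
```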
In one embodiment, an image capture system is mounted on or in an aircraft to obtain the appropriate raw images using the methods and systems described in this document and to guide the aircraft pilot to the correct coordinates. Figure 9 illustrates an illustrative aircraft equipped with the necessary equipment according to this embodiment. The aircraft 510 is fitted with the removable compartment or housing 520, which is rigidly mounted to the aircraft 510. In one embodiment, assembly is performed by removing the passenger-side door from the aircraft and replacing the door with the door / compartment support. The removable compartment or housing 520 contains several cameras, as described above with reference to figure 4. In one embodiment, a series of movable doors cover the cameras in the removable compartment or housing 520 to protect the cameras during parts of the flight, including takeoff and landing. In one embodiment, sensors are incorporated inside the camera doors so that the condition of each door can be monitored. In one embodiment, the cameras and doors in the removable compartment or housing 520 are connected to a computer 1000. In this embodiment, the computer 1000 runs software developed to control and operate the elements of the system during flight. Although represented as a laptop, the computer 1000 can be any computer, including a laptop, a ruggedized personal computer, a system embedded in the aircraft, a specialized computer, or a portable device such as a Personal Digital Assistant or cell phone.
Again referring to figure 9, the computer 1000 is connected to a Global Positioning System (GPS) unit 1010, which produces a feed for tracking the current position of the aircraft and recording the current position in storage on the computer 1000. The camera control unit 1030 controls the arrangement of cameras in the removable compartment or housing 520, including sending signals for autofocus and for taking photographs.
In the embodiment illustrated in figure 10, the GPS unit 1010 serves as a navigation system / subsystem, while the computer 1000 serves as a synchronization system / subsystem.
In an alternative embodiment, the computer 1000 incorporates the functionality of the navigation system and may include the GPS unit 1010. In yet another embodiment, a dedicated unit has subsystems providing the navigation and synchronization functions.
Flight display 1020 is connected to computer 1000 and in one embodiment displays flight details.
In an alternative embodiment, the flight display 1020 shows the condition of the system as a whole, including the condition of the doors and the activity of the cameras when acquiring images.
The flight display 1020 can be the monitor of the personal computer 1000, an additional external monitor or a monitor built into the aircraft.
The flight display 1020 can be a touch sensitive monitor and allows commands to be entered into the system.
Alternatively, a mouse, keyboard or other input device (not shown) can be used to receive user input.
In one embodiment, the system displays a variety of information for the pilot of the aircraft 510. This information can be displayed on the flight display 1020, on the monitor of the computer 1000 or on another display available to the pilot. The system displays the flight lines of a projected area, the defined geographic areas, and the survey data that define the actual area within the map to be captured.
Figure 10 illustrates a block diagram of the computer 1000 working in conjunction with a controller 1120 and a GPS device 1122. In one embodiment, the computer 1000 includes at least one Universal Serial Bus (USB) port 1100 that connects with a USB hub 1112. The USB hub 1112 has multiple additional USB ports that allow devices to be connected to and communicate with the computer 1000. The USB port 1114 is connected with a controller 1120. As will be understood by those skilled in the art, other types of bus, wired or wireless, serial or parallel, can be used to interconnect the components of figure 10. In one embodiment, the controller 1120 is a camera control unit (for example, camera control unit 1030) and controls the camera(s) in the removable compartment or housing 520, using the autofocus control 1130 and the shutter control 1132 on the camera(s). The controller 1120 also reads a door sensor 1134 to determine whether the doors protecting the cameras in the removable compartment or housing 520 are open or closed. The doors can be opened or closed as appropriate in response to the controller 1120 reading the door sensor 1134. The GPS device 1122 is connected to the USB hub 1112 via USB ports 1116, 1118. The GPS device 1122 reads the current geographical location of the device and transmits this data to the computer 1000. The controller 1120 is capable of sending a signal causing a photograph to be taken from the removable compartment or housing 520.
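A minimal sketch of the door-then-shutter control logic this paragraph describes, assuming hypothetical callables (read_door_sensor, trigger_autofocus and trigger_shutter are illustrative names standing in for the controller's links to the door sensor 1134, autofocus control 1130 and shutter control 1132, not a disclosed API):

```python
import time

def capture_photograph(read_door_sensor, trigger_autofocus, trigger_shutter,
                       timeout_s: float = 5.0) -> bool:
    """Fire the camera only once the protective door is confirmed open.

    The three callables stand in for the controller's connections to the
    door sensor, the autofocus control and the shutter control.
    """
    deadline = time.monotonic() + timeout_s
    while not read_door_sensor():      # wait for the door to report open
        if time.monotonic() > deadline:
            return False               # door never opened; skip this frame
        time.sleep(0.05)
    trigger_autofocus()
    trigger_shutter()
    return True

# Assumed usage with stand-in callables:
ok = capture_photograph(lambda: True, lambda: None, lambda: None)
print("photograph taken:", ok)
```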
The embodiments of the present description can be implemented with any combination of hardware and software. If implemented as a computer-implemented device, the present description is implemented using the device to perform all of the steps and functions described above.
The embodiments of the present description can be included in an article of manufacture (for example, one or more computer program products) having, for example, computer-usable or computer-readable media. The media have embodied therein, for example, computer-readable program code means, including computer-executable instructions, for providing and facilitating the mechanisms of this description. The article of manufacture may be included as part of a computer system or sold separately. While specific embodiments have been described in detail in the preceding detailed description and illustrated in the accompanying drawings, it will be appreciated by those skilled in the art that various modifications and alternatives to these details could be developed in light of the overall teachings of the description and the broad concepts of the invention. Therefore, it is understood that the scope of the present description is not limited to the particular examples and implementations disclosed in this document, but is intended to cover modifications within the spirit and scope thereof as defined by the appended claims and by any and all equivalents thereof.
Claims

1. Image capture system, comprising: a first image capture subsystem comprising a first imaging device configured to capture, at a first moment in time, at least one overview image of an overview area; and a second image capture subsystem comprising a second image generation device for capturing, substantially simultaneously with the first moment in time, at least one detail image of at least part of the overview area, the first and second image capture subsystems being configured so that multiple overview images result in an overview redundancy of image elements among the multiple overview images and multiple detail images result in a detail redundancy of image elements among the multiple detail images.
2. Image capture system, according to claim 1, in which the first image capture subsystem includes multiple image generation devices arranged to capture the multiple overview images.
3. Image capture system, according to claim 2, in which the multiple imaging devices are arranged to capture the multiple overview images substantially simultaneously.
4. Image capture system, according to claim 2, in which the multiple image generation devices are arranged in an adjacent manner to capture the multiple overview images.
5. Image capture system, according to claim 1, in which the first and second image capture subsystems are located in close proximity to each other.
6. Image capture system according to claim 1, in which the redundancies represent the degree to which the image elements appear in several images.
7. Image capture system, according to claim 6, in which the image elements include one or more identifiable aspects, areas or markings corresponding to the area captured in at least one overview image and at least one detail image.
8. Image capture system, according to claim 1, in which the redundancy of the overview is greater than 10.
9. Image capture system, according to claim 1, in which the detail redundancy is less than or equal to 10.
10. Image capture system, comprising: a first image capture subsystem comprising a first image generation device configured to capture, at a first moment in time, at least one overview image of an overview area at a first resolution on the ground; and a second image capture subsystem comprising a second imaging device configured to capture, substantially simultaneously with the first moment in time, at least one detail image of at least part of the overview area, the at least one detail image being at a second resolution on the ground that is higher than the first resolution on the ground.
11. Image capture system, according to claim 10, in which the first and second image capture subsystems are configured in such a way that multiple overview images result in a redundancy of image elements among the multiple overview images and multiple detail images result in a redundancy of image elements among the multiple detail images.
12. Image capture system, according to claim 10, in which the first image capture subsystem includes multiple image generation devices arranged to capture multiple overview images of one or more overview areas.
13. Image capture system, according to claim 12, in which the multiple imaging devices are arranged to capture the multiple overview images substantially simultaneously.

14. Image capture system, according to claim 12, in which the multiple imaging devices are arranged in an adjacent manner to capture the multiple overview images.
15. Image capture system, according to claim 10, in which the first multiple image generation devices are arranged in a cascaded manner to capture the multiple overview images.
16. Image capture system, according to claim 10, in which the second image capture subsystem includes multiple image capture devices arranged to capture multiple detail images of one or more detail areas.
17. Image capture system, according to claim 16, in which the multiple imaging devices are arranged to capture the multiple detail images substantially simultaneously.
18. Image capture system, according to claim 16, in which the second multiple imaging devices are arranged in an adjacent manner to capture the multiple detail images.
19. The system of claim 16, wherein the multiple imaging devices are arranged in a cascaded manner to capture the multiple detail images.
20. Image capture system, according to claim 10, in which the first and second image capture subsystems are located in close proximity to each other.
21. Image capture system, according to claim 10, in which the first and second image capture subsystems are mounted inside or on an aircraft.
22. Image capture system, according to claim 21, in which the first and second image capture subsystems are arranged inside a housing of the aircraft.

23. Image capture system, according to claim 22, in which the housing is removably connected to the aircraft.
24. The system of claim 10, wherein the at least one second imaging device is a digital camera.
25. The system of claim 10, wherein the at least one second imaging device is a CMOS sensor.
26. The system of claim 10, wherein the at least one second imaging device is a push broom sensor.
27. The system of claim 10, wherein the at least one second imaging device is a whisk broom sensor.
28. The system of claim 10, wherein the overview images are stored locally within the first image capture subsystem.
29. The system of claim 10, wherein the detail images are stored locally within the second image capture subsystem.
30. Method for capturing images, the method comprising: (a) capturing, by a first image capture subsystem, a first overview image of an overview area, the first overview image being at a first resolution on the ground; and (b) capturing, by a second image capture subsystem, substantially simultaneously with the capture of the first overview image, a first detail image of at least part of the overview area, the first detail image being at a second resolution on the ground that is higher than the first resolution on the ground.
31. The method of claim 30, further comprising: (c) translating the first and second image capture subsystems along a first geometric axis; (d) capturing, by the first image capture subsystem, a second overview image of a second overview area at the first resolution on the ground, wherein the first and second overview images have at least an overview overlap portion with respect to each other; and (e) capturing, by the second image capture subsystem, a second detail image at the second resolution on the ground, wherein the first and second detail images have at least a detail overlap portion with respect to each other that is substantially smaller than the overview overlap portion of the first and second overview images.
32. The method of claim 31, wherein at least one of the overview overlap portion and the detail overlap portion results in an overview redundancy with respect to the first and second overview images and in a detail redundancy with respect to the first and second detail images.
33. The method of claim 32, wherein the redundancies represent the degree to which the image elements appear in the multiple images.
34. The method of claim 33, wherein the image elements include one or more identifiable aspects, areas or markings corresponding to an area captured in the overview image and in at least one detail image.
35. The method of claim 32, wherein the redundancy of the overview is greater than 10.
36. The method of claim 32, wherein the redundancy of the detail is less than or equal to 10.
37. The method of claim 32, wherein the redundancy of the overview is greater than 10 and the redundancy of the detail is less than or equal to 10.
38. The method of claim 31, wherein the area of the at least one overview overlap portion along the first geometric axis is greater than or equal to 50% of the area of one of the first and second overview images.
39. The method of claim 31, wherein the area of the at least one detail overlap portion along the first geometric axis is less than or equal to 20% of the area of one of the first and second detail images.